Constructing Prior Distributions
In the previous section we saw how to use the BayesianGenerator object, much as we use the generator function. We now introduce the GeneratorParameterDistributions object, which constructs prior distributions for Bayesian inference.
We first load the necessary packages and functions
using Random, MarkovChainHammer, Statistics
using MarkovChainHammer.BayesianMatrix
using MarkovChainHammer.TransitionMatrix: generator
using MarkovChainHammer.Trajectory: generate
Random.seed!(1234)
Random.TaskLocalRNG()
We now introduce the GeneratorParameterDistributions object, which is exported from MarkovChainHammer.BayesianMatrix
number_of_states = 3
prior = GeneratorParameterDistributions(number_of_states)
GeneratorParameterDistributions
rates
3-element Vector{Distributions.Gamma{Float64}}:
Distributions.Gamma{Float64}(α=1.0, θ=1.0)
Distributions.Gamma{Float64}(α=1.0, θ=1.0)
Distributions.Gamma{Float64}(α=1.0, θ=1.0)
exit_probabilities
3-element Vector{Distributions.Dirichlet{Float64, Vector{Float64}, Float64}}:
Distributions.Dirichlet{Float64, Vector{Float64}, Float64}(alpha=[1.0, 1.0])
Distributions.Dirichlet{Float64, Vector{Float64}, Float64}(alpha=[1.0, 1.0])
Distributions.Dirichlet{Float64, Vector{Float64}, Float64}(alpha=[1.0, 1.0])
The prior distribution constitutes a guess for the entries of the generator matrix. We can check the mean of a random generator matrix drawn from the prior distribution by calling the mean function on the prior
mean(prior)
3×3 Matrix{Float64}:
-1.0 0.5 0.5
0.5 -1.0 0.5
0.5 0.5 -1.0
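The structure of this mean follows from the prior components: each off-diagonal entry is the mean exit rate (1.0 under the Gamma(α=1, θ=1) prior) times the mean exit probability (0.5 under the Dirichlet prior), and each diagonal entry is the negative mean exit rate. As a minimal sketch, assuming the rates and exit_probabilities fields are accessible as displayed above, we can check the component means directly
mean(prior.rates[1])              # 1.0, the mean exit rate under Gamma(α=1, θ=1)
mean(prior.exit_probabilities[1]) # [0.5, 0.5], the mean exit probabilities under Dirichlet([1, 1])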
From a known generator we now create a Markov chain
Q = [-1.0 4/3 2; 1/4 -2.0 1; 3/4 2/3 -3.0]
dt = 0.01
markov_chain = generate(Q, 10000; dt = dt)'
1×10000 adjoint(::Vector{Int64}) with eltype Int64:
1 1 1 1 1 1 1 1 1 1 1 1 1 … 1 1 1 1 1 1 1 1 1 1 1 1
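As a point of comparison, the generator function imported above can estimate the empirical (frequentist) generator directly from the chain. A hedged sketch, assuming generator accepts the chain together with the same dt keyword used by generate and BayesianGenerator
Q_empirical = generator(markov_chain[:]; dt = dt) # empirical generator from the sampled chain; markov_chain[:] flattens the adjoint into a plain vector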
and then use our prior distribution together with the BayesianGenerator object
Q_bayes = BayesianGenerator(markov_chain, prior; dt = dt)
BayesianGenerator
prior variance
3×3 Matrix{Float64}:
1.0 0.416667 0.416667
0.416667 1.0 0.416667
0.416667 0.416667 1.0
prior mean
3×3 Matrix{Float64}:
-1.0 0.5 0.5
0.5 -1.0 0.5
0.5 0.5 -1.0
posterior variance
3×3 Matrix{Float64}:
0.0182632 0.0550691 0.0995698
0.00443534 0.0847981 0.0302305
0.0138278 0.0285697 0.130521
posterior mean
3×3 Matrix{Float64}:
-1.13067 1.064 2.19547
0.274592 -1.62134 0.672081
0.856081 0.557335 -2.86755
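Beyond the posterior mean and variance displayed above, the BayesianGenerator object behaves like a distribution over generator matrices. As a sketch, assuming rand is extended for BayesianGenerator, we can draw a random generator consistent with the posterior
Q_sample = rand(Q_bayes) # one random 3×3 generator matrix drawn from the posterior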
We get a different answer than before due to the presence of the prior distribution. For comparison, we construct a BayesianGenerator with the default uninformative prior
Q_bayes_uninformative_prior = BayesianGenerator(markov_chain; dt = dt)
BayesianGenerator
prior variance
3×3 Matrix{Float64}:
7.03687e13 3.51844e13 3.51844e13
3.51844e13 7.03687e13 3.51844e13
3.51844e13 3.51844e13 7.03687e13
prior mean
3×3 Matrix{Float64}:
-1.0 0.5 0.5
0.5 -1.0 0.5
0.5 0.5 -1.0
posterior variance
3×3 Matrix{Float64}:
0.0185982 0.0609135 0.109155
0.00442456 0.0913703 0.0318369
0.0142707 0.0304568 0.140992
posterior mean
3×3 Matrix{Float64}:
-1.13282 1.10375 2.28898
0.266546 -1.65563 0.66762
0.866273 0.551876 -2.9566
In particular, we can compare the two posterior means
mean(Q_bayes) - mean(Q_bayes_uninformative_prior)
3×3 Matrix{Float64}:
0.00214535 -0.0397491 -0.0935189
0.00804652 0.0342902 0.00446084
-0.0101919 0.00545887 0.089058
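Since the true generator Q that produced the data is known here, we can also check how far each posterior mean is from it; the informed prior pulls its estimate slightly toward the prior mean
mean(Q_bayes) - Q                     # posterior mean with the informed prior minus the true generator
mean(Q_bayes_uninformative_prior) - Q # posterior mean with the uninformative prior minus the true generator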